
[ENH] Add LIPOOptimizer backend integration (#102)#242

Open
direkkakkar319-ops wants to merge 12 commits into hyperactive-project:main from direkkakkar319-ops:issue#102-DirekKakkar-new

Conversation

@direkkakkar319-ops

Description

Adds LIPOOptimizer as a new backend in Hyperactive, integrating the lipo package. This gives users a zero-configuration optimizer that requires no tuning of its own hyperparameters, complementing the existing GFO, Optuna, and sklearn backends.

Related Issues

Closes #102

Type of Change

  • [BUG] - Bug fix (non-breaking change fixing an issue)
  • [ENH] - New feature (non-breaking change adding functionality)
  • [DOC] - Documentation changes
  • [MNT] - Maintenance

How was this solved?

  1. Created src/hyperactive/opt/lipo.py with a LIPOOptimizer wrapper class that bridges lipo's continuous-bounds API to Hyperactive's discrete search space interface
  2. The wrapper parses the search space into lower_bounds, upper_bounds, and categories as required by lipo
  3. A _snap_to_grid method maps lipo's continuous output back to the nearest valid point in the original discrete search space
  4. The best result is retrieved via opt.optimum[0] in src/hyperactive/opt/lipo.py, which returns the best params dict
  5. Exported LIPOOptimizer from src/hyperactive/opt/__init__.py
  6. Added lipo-integration as an optional dependency in pyproject.toml
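The search-space split in step 2 can be sketched roughly as follows. Note that `split_search_space` is a hypothetical helper written for illustration, not the actual code in src/hyperactive/opt/lipo.py; it only assumes the lower_bounds/upper_bounds/categories structure described above.

```python
def split_search_space(search_space):
    """Split a discrete search space (dict of name -> list of values)
    into the lower_bounds / upper_bounds / categories dicts that a
    continuous-bounds optimizer like lipo expects.

    Numeric dimensions become [min, max] bounds; everything else is
    treated as a categorical choice list.
    """
    lower_bounds, upper_bounds, categories = {}, {}, {}
    for name, values in search_space.items():
        if all(isinstance(v, (int, float)) for v in values):
            lower_bounds[name] = min(values)
            upper_bounds[name] = max(values)
        else:
            categories[name] = list(values)
    return lower_bounds, upper_bounds, categories


# example: one numeric dimension, one categorical dimension
space = {"x": [0.0, 0.5, 1.0, 1.5], "kernel": ["rbf", "linear"]}
lo, hi, cats = split_search_space(space)
```

A wrapper built this way loses the intermediate grid points of numeric dimensions (only min/max survive), which is exactly why the snap-to-grid step in 3 is needed on the way back.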

Checklist

  • PR title includes appropriate tag: [BUG], [ENH], [DOC] or [MNT]
  • Linked to related issue (if applicable)
  • Code passes make check (lint, format, isort)
  • Tests added/updated for changes (if applicable)
  • Documentation updated (if applicable)

Testing

pytest src/hyperactive/tests/test_lipo.py examples/test_examples.py -v -k "lipo"

The tests for lipo are also passing locally.

Screenshot 2026-03-24 003840

All 4 tests should pass:
test_lipo_basic
test_lipo_categorical
test_lipo_snap_to_grid
test_example_runs_successfully[examples/lipo/lipo_examples.py]
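test_lipo_snap_to_grid presumably exercises logic like the following. This is a hedged, standalone sketch of nearest-point snapping, not the PR's actual `_snap_to_grid` method; the function name and signature here are made up for illustration.

```python
def snap_to_grid(point, search_space):
    """Map a continuous suggestion back onto the discrete search space:
    numeric dimensions snap to the nearest valid grid value, while
    categorical values are already valid and pass through unchanged."""
    snapped = {}
    for name, values in search_space.items():
        if all(isinstance(v, (int, float)) for v in values):
            snapped[name] = min(values, key=lambda v: abs(v - point[name]))
        else:
            snapped[name] = point[name]
    return snapped


space = {"x": [0.0, 0.5, 1.0], "kernel": ["rbf", "linear"]}
result = snap_to_grid({"x": 0.34, "kernel": "rbf"}, space)
```

Here 0.34 is closer to 0.5 than to 0.0, so the numeric dimension snaps up while the categorical value is returned as-is.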

Additional Notes

lipo's GlobalOptimizer is parameter-free by design, meaning no "tuning of the tuner" is required; this is the key motivation for this integration per issue #102.

Before this fix, the tests were failing locally:

Screenshot 2026-03-24 003304

This was due to this code in src/hyperactive/opt/lipo.py:

opt.run(self.n_iter)
return self._snap_to_grid(opt.maximum["x"])

which was changed to:

opt.run(self.n_iter)
return self._snap_to_grid(opt.optimum[0])

All tests now pass locally.
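To see why the change matters end to end, here is a runnable sketch of the corrected retrieval path. `StubOptimizer` is a made-up stand-in for lipo's GlobalOptimizer (so the example runs without the dependency installed), and the assumption that `optimum[0]` holds the best-params dict follows the PR description above.

```python
class StubOptimizer:
    """Stand-in for lipo's GlobalOptimizer: pretends the best point
    found after run() is x = 0.42 with score 0.99."""

    def __init__(self):
        self.optimum = ({"x": 0.42}, 0.99)  # (best params, best score)

    def run(self, n_iter):
        pass  # real lipo would evaluate the objective n_iter times


def optimize(opt, n_iter, snap):
    opt.run(n_iter)
    # the fix from the PR: read opt.optimum[0], not opt.maximum["x"]
    return snap(opt.optimum[0])


grid = [0.0, 0.5, 1.0]
snap = lambda p: {"x": min(grid, key=lambda v: abs(v - p["x"]))}
best = optimize(StubOptimizer(), 25, snap)
```

The continuous suggestion 0.42 snaps to the nearest valid grid point, 0.5; the old `opt.maximum["x"]` access would have raised an AttributeError here because no such attribute exists on the stub (nor, per the PR, on lipo's optimizer).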


Files created or changed:

README.md
examples/lipo/lipo_examples.py
pyproject.toml
src/hyperactive/opt/__init__.py
src/hyperactive/opt/lipo.py
src/hyperactive/tests/test_lipo.py

@direkkakkar319-ops direkkakkar319-ops marked this pull request as ready for review March 23, 2026 19:26


Development

Successfully merging this pull request may close these issues.

[ENH] exploring integration with lipo
